5 research outputs found

    Moving Sounds Enhance the Visually-Induced Self-Motion Illusion (Circular Vection) in Virtual Reality

    While rotating visual and auditory stimuli have long been known to elicit self-motion illusions (“circular vection”), audiovisual interactions have hardly been investigated. Here, two experiments investigated whether visually induced circular vection can be enhanced by concurrently rotating auditory cues that match visual landmarks (e.g., a fountain sound). Participants sat behind a curved projection screen displaying rotating panoramic renderings of a marketplace. Apart from a no-sound condition, headphone-based auditory stimuli consisted of mono sound, ambient sound, or low-/high-spatial-resolution auralizations using generic head-related transfer functions (HRTFs). While merely adding non-rotating (mono or ambient) sound showed no effect, moving sound stimuli facilitated both vection and presence in the virtual environment. This spatialization benefit was maximal for a medium (20° × 15°) field of view (FOV), reduced for a larger (54° × 45°) FOV, and unexpectedly absent for the smallest (10° × 7.5°) FOV. Increasing auralization spatial fidelity (from low, comparable to five-channel home theatre systems, to high, 5° resolution) provided no further benefit, suggesting a ceiling effect. In conclusion, both self-motion perception and presence can benefit from adding moving auditory stimuli. This has important implications both for multimodal cue integration theories and for the applied challenge of building affordable yet effective motion simulators.
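    The rotating auditory landmarks in this study were auralized with generic HRTFs; as a rough illustration of the general idea (not the auralization pipeline used in the experiments), the sketch below pans a mono landmark sound around the listener with a simple constant-power stereo law standing in for full binaural filtering. The rotation speed and the noise placeholder for the fountain recording are assumptions.

        import numpy as np

        fs = 44100                              # sample rate (Hz)
        dur = 10.0                              # stimulus duration (s)
        rot_speed = 30.0                        # rotation speed in deg/s (assumed value)

        t = np.arange(int(fs * dur)) / fs
        source = 0.1 * np.random.randn(t.size)  # placeholder for a recorded landmark (e.g. a fountain)
        azimuth = np.deg2rad(rot_speed * t)     # source azimuth over time, sweeping full circles

        # Constant-power panning as a crude interaural level cue; a real
        # auralization would instead convolve the source with left/right HRTFs
        # selected (or interpolated) for the current azimuth.
        gain_l = np.sqrt(0.5 * (1.0 - np.sin(azimuth)))
        gain_r = np.sqrt(0.5 * (1.0 + np.sin(azimuth)))
        stereo = np.stack([source * gain_l, source * gain_r], axis=1)   # (samples, 2) for playback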

    Affective Multimodal Displays: Acoustic Spectra Modulates Perception of Auditory-Tactile Signals

    Presented at the 14th International Conference on Auditory Display (ICAD2008), June 24-27, 2008, Paris, France. Emotional events may interrupt ongoing cognitive processes and automatically grab attention, modulating subsequent perceptual processes. Hence, emotion-eliciting stimuli might effectively be used in warning applications, where a fast and accurate response from users is required. In addition, conveying information through an optimum multisensory combination can lead to a further enhancement of user responses. In the present study we investigated the emotional response to sounds differing in their acoustic spectra, and their influence on speeded detection of auditory-somatosensory stimuli. Higher sound frequencies resulted in an increase in emotional arousal. We suggest that emotional processes might be responsible for the different auditory-somatosensory integration patterns observed for low- and high-frequency sounds. The presented results may have important implications for the design of auditory and multisensory warning interfaces.

    The Effects of Explicit and Implicit Interaction on User Experiences in a Mixed Reality Installation: The Synthetic Oracle

    Virtual and mixed reality environments (VMRE) often imply full-body human-computer interaction scenarios. We used a public multimodal mixed reality installation, the Synthetic Oracle, and a between-groups design to study the effects of implicit (e.g., passively walking) or explicit (e.g., pointing) interaction modes on users' emotional and engagement experiences, assessed using questionnaires. Additionally, real-time arm motion data were used to categorize user behavior and to provide interaction possibilities for the explicit interaction group. The results show that the online behavior classification corresponded well to the users' interaction mode. In addition, and contrary to explicit interaction, the engagement ratings of implicit users were positively correlated with valence ratings but uncorrelated with arousal ratings. Interestingly, arousal levels were correlated with different behaviors displayed by the visitors depending on the interaction mode. Hence, this study confirms that the activity level and behavior of users modulate their experience and that, in turn, the interaction mode modulates their behavior. These results show the importance of the selected interaction mode when designing user experiences in VMRE.
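    The abstract mentions that real-time arm motion data were used to categorize visitor behavior online. Purely as an illustration of that kind of classification (not the installation's actual algorithm), the sketch below labels fixed-length windows of tracked arm positions by thresholding mean movement speed; the sampling rate and all threshold values are assumed.

        import numpy as np

        def classify_behavior(arm_xyz, fs=30, window_s=1.0,
                              idle_thresh=0.02, gesture_thresh=0.15):
            """arm_xyz: (n_samples, 3) tracked arm position in metres, sampled at fs Hz.
            Returns one label per window: 'idle', 'walking', or 'gesturing'."""
            win = int(fs * window_s)
            # Per-sample movement speed in m/s from consecutive position differences.
            speeds = np.linalg.norm(np.diff(arm_xyz, axis=0), axis=1) * fs
            labels = []
            for start in range(0, speeds.size - win + 1, win):
                energy = speeds[start:start + win].mean()
                if energy < idle_thresh:
                    labels.append('idle')
                elif energy < gesture_thresh:
                    labels.append('walking')
                else:
                    labels.append('gesturing')
            return labels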

    Sonification of Brain and Body Signals in Collaborative Tasks Using a Tabletop Musical Interface

    Presented at the 17th International Conference on Auditory Display (ICAD2011), 20-23 June, 2011, Budapest, Hungary. Physiological computing has been applied in different disciplines and is becoming popular and widespread in Human-Computer Interaction, due to device miniaturization and improvements in real-time processing. However, most studies on physiology-based interfaces focus on single-user systems, while their use in Computer-Supported Collaborative Work (CSCW) is still emerging. The present work explores how sonification of human brain and body signals can enhance user experience in collaborative music composition. For this task, a novel multimodal interactive system is built using a musical tabletop interface (Reactable) and a hybrid Brain-Computer Interface (BCI). The described system allows performers to generate and control sounds using their own or their fellow team member's physiology. Recently, we assessed this physiology-based collaboration system in a pilot experiment. Discussion of the results and future work on new sonifications will be accompanied by a practical demonstration during the conference.
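    As a toy illustration of sonifying a body signal (not the Reactable/BCI mappings of the system described above), the sketch below maps a normalized control trace, standing in for something like EEG band power, onto the pitch of a continuous tone. The block size and the 220-880 Hz mapping range are assumptions chosen for illustration.

        import numpy as np

        fs = 44100

        def sonify(control, block_s=0.25, f_lo=220.0, f_hi=880.0):
            """control: 1-D array of values in [0, 1], one per block.
            Returns an audio signal whose pitch tracks the control signal."""
            out = []
            phase = 0.0
            n = int(fs * block_s)
            for c in control:
                freq = f_lo + c * (f_hi - f_lo)          # linear pitch mapping
                t = np.arange(n) / fs
                out.append(0.2 * np.sin(2 * np.pi * freq * t + phase))
                # Carry the phase forward so pitch changes stay click-free.
                phase = (phase + 2 * np.pi * freq * block_s) % (2 * np.pi)
            return np.concatenate(out)

        # Example: a synthetic control trace standing in for band power from a BCI.
        audio = sonify(np.abs(np.sin(np.linspace(0, np.pi, 40))))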

    Travelling without moving: Auditory scene cues for translational self-motion

    Presented at the 11th International Conference on Auditory Display (ICAD2005). Creating a sense of illusory self-motion is crucial for many Virtual Reality applications, and the auditory modality is an essential, but often neglected, component of such stimulations. In this paper, perceptual optimization of auditory-induced, translational self-motion (vection) simulation is studied using binaurally synthesized and reproduced sound fields. The results suggest that auditory scene consistency and ecological validity make a minimum set of acoustic cues sufficient for eliciting auditory-induced vection. Specifically, it was found that a focused attention task and the motion characteristics of sound objects (approaching or receding) play an important role in self-motion perception. In addition, stronger sensations for auditory-induced self-translation than for previously investigated self-rotation also suggest a strong ecological validity bias, as translation is the most common movement direction.
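    The approaching and receding sound objects mentioned above can be illustrated with a very small distance-cue sketch (this is not the study's binaural synthesis): a mono source is scaled by a 1/distance amplitude law as the simulated listener translates toward or away from it. The speed, starting distance, and noise placeholder for the source are assumptions.

        import numpy as np

        fs = 44100
        dur = 5.0
        speed = 2.0          # simulated forward speed of the listener in m/s (assumed)
        start_dist = 12.0    # initial distance to the sound object in metres (assumed)

        t = np.arange(int(fs * dur)) / fs
        source = 0.1 * np.random.randn(t.size)        # placeholder for a recorded object sound

        # Object approaches as the listener "moves" forward; clamp to avoid division by ~0.
        distance = np.maximum(start_dist - speed * t, 0.5)
        approaching = source / distance               # louder as the distance shrinks
        receding = source / (start_dist + speed * t)  # quieter as the distance grows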